Balancing stability and bias reduction in variable selection with the Mnet estimator

Authors

  • Jian Huang
  • Patrick Breheny
  • Sangin Lee
  • Shuangge Ma
  • Cun-Hui Zhang
Abstract

We propose a new penalized approach for variable selection using a combination of minimax concave and ridge penalties. The proposed method is designed to deal with p ≥ n problems with highly correlated predictors. We call the proposed approach the Mnet method. Similar to the elastic net of Zou and Hastie (2005), the Mnet also tends to select or drop highly correlated predictors together. However, unlike the elastic net, the Mnet is selection consistent and equal to the oracle ridge estimator with high probability under reasonable conditions. We develop an efficient coordinate descent algorithm to compute the Mnet estimates. Simulation studies show that the Mnet has better performance in the presence of highly correlated predictors than either the elastic net or MCP. Finally, we illustrate the application of the Mnet to real data from a gene expression study in ophthalmology.
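The penalty the abstract describes adds a ridge term to the minimax concave penalty (MCP), and the resulting coordinate-wise update for a standardized least-squares problem has a closed form. The sketch below is a minimal illustration under stated assumptions, not the authors' code (a production implementation is available in Breheny's ncvreg R package): it assumes columns of X are standardized so that x_j'x_j / n = 1, fixes the concavity parameter γ = 3, and all function names are mine.

```python
import numpy as np

def mnet_penalty(beta, lam1, lam2, gamma=3.0):
    """Mnet penalty: MCP(lam1, gamma) plus a ridge term (lam2/2) * beta^2."""
    b = np.abs(beta)
    mcp = np.where(b <= gamma * lam1,
                   lam1 * b - b**2 / (2.0 * gamma),   # concave region
                   0.5 * gamma * lam1**2)             # flat region
    return np.sum(mcp + 0.5 * lam2 * beta**2)

def soft(z, t):
    """Soft-thresholding operator."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def mnet_cd(X, y, lam1, lam2, gamma=3.0, n_iter=200):
    """Naive coordinate descent for least squares + Mnet penalty.
    Assumes columns of X are standardized so x_j'x_j / n = 1."""
    n, p = X.shape
    beta = np.zeros(p)
    r = y - X @ beta                         # running residual
    for _ in range(n_iter):
        for j in range(p):
            z = X[:, j] @ r / n + beta[j]    # unpenalized coordinate solution
            if abs(z) <= gamma * lam1 * (1.0 + lam2):
                new = soft(z, lam1) / (1.0 + lam2 - 1.0 / gamma)
            else:
                new = z / (1.0 + lam2)       # beyond the MCP region: ridge only
            r += X[:, j] * (beta[j] - new)   # update residual in place
            beta[j] = new
    return beta
```

With λ2 = 0 the update reduces to the usual MCP rule; with γ → ∞ it reduces to the elastic net's soft-thresholded ridge update, which is why the Mnet inherits the grouping behavior of the ridge term while removing most of the soft-thresholding bias on large coefficients.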


Similar articles

The Relative Improvement of Bias Reduction in Density Estimator Using Geometric Extrapolated Kernel

One of the nonparametric procedures used to estimate densities is the kernel method. In this paper, in order to reduce the bias of kernel density estimation, methods such as the usual kernel (UK), geometric extrapolation usual kernel (GEUK), a bias-reduction kernel (BRK) and a geometric extrapolation bias-reduction kernel (GEBRK) are introduced. Theoretical properties, including the selection of smoothness para...


Spatial Regression in the Presence of Misaligned Data

In this paper, four approaches are presented to the problem of fitting a linear regression model in the presence of spatially misaligned data. These approaches are the plug-in method, simulation, regression calibration and maximum likelihood. In the first two approaches, by modeling the correlation of the explanatory variable, a prediction of the explanatory variable is determined at sites...


Bayesian inference for high-dimensional linear regression under mnet priors

Abstract: For regression problems that involve many potential predictors, the Bayesian variable selection (BVS) method is a powerful tool, which associates each model with its posterior probability and achieves superb prediction performance through Bayesian model averaging (BMA). Two challenges of using such models are specifying a suitable prior and computing posterior quantities for infe...


Jackknifed Liu-type Estimator in Poisson Regression Model

The Liu estimator has consistently been demonstrated to be an attractive shrinkage method for reducing the effects of multicollinearity. The Poisson regression model is a well-known model in applications when the response variable consists of count data. However, it is known that multicollinearity negatively affects the variance of the maximum likelihood estimator (MLE) of the Poisson regressio...


Comment (Donglin Zeng)

Cattaneo, Crump, and Jansson (2013) present an interesting estimator, namely the generalized jackknife estimator, for estimating weighted average derivatives. Starting with a high-order (in this case, second-order) linearization of the estimating equation, they obtain the asymptotic approximation under a weak bandwidth selection which does not require the standard convergence rate of the nonpar...



Journal title:

Volume   Issue

Pages  -

Publication date: 2013